
    Computing a k-sparse n-length Discrete Fourier Transform using at most 4k samples and O(k log k) complexity

    Given an $n$-length input signal $\mathbf{x}$, it is well known that its Discrete Fourier Transform (DFT), $\mathbf{X}$, can be computed in $O(n \log n)$ complexity using a Fast Fourier Transform (FFT). If the spectrum $\mathbf{X}$ is exactly $k$-sparse (where $k \ll n$), can we do better? We show that asymptotically in $k$ and $n$, when $k$ is sub-linear in $n$ (precisely, $k \propto n^{\delta}$ where $0 < \delta < 1$) and the support of the non-zero DFT coefficients is uniformly random, we can exploit this sparsity in two fundamental ways: (i) sample complexity: we need only $M = rk$ deterministically chosen samples of the input signal $\mathbf{x}$ (where $r < 4$ when $0 < \delta < 0.99$); and (ii) computational complexity: we can reliably compute the DFT $\mathbf{X}$ using $O(k \log k)$ operations, where the constants in the big Oh are small and are related to the constants involved in computing a small number of DFTs of length approximately equal to the sparsity parameter $k$. Our algorithm succeeds with high probability, with the probability of failure vanishing to zero asymptotically in the number of samples acquired, $M$. Comment: 36 pages, 15 figures. To be presented at ISIT 2013, Istanbul, Turkey.
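
    The structural fact behind such subsampling-based sparse-DFT algorithms is that decimating the signal in time aliases (folds) its spectrum, so a sparse spectrum lands in mostly singleton bins that a short FFT can compute cheaply. The numpy sketch below only verifies this aliasing/binning identity; it is not the paper's full peeling decoder, and the signal length, decimation factor, and sparsity used are illustrative choices.

```python
import numpy as np

# Aliasing identity used by subsampling-based sparse FFTs: taking every d-th
# time sample and computing a length-(n/d) FFT folds the length-n spectrum,
# summing (and scaling by 1/d) all coefficients with equal index mod n/d.
# This is only the binning step, not the paper's full decoding algorithm.

rng = np.random.default_rng(0)
n, d = 1024, 16          # signal length and decimation factor (n/d bins)
k = 8                    # sparsity (illustrative choice)

# Build a k-sparse spectrum with uniformly random support and unit-magnitude values.
X = np.zeros(n, dtype=complex)
support = rng.choice(n, size=k, replace=False)
X[support] = np.exp(2j * np.pi * rng.random(k))

x = np.fft.ifft(X)                       # time-domain signal whose DFT is X
x_sub = x[::d]                           # keep only n/d deterministically chosen samples
X_folded = np.fft.fft(x_sub)             # short FFT: O((n/d) log(n/d)) work

# Bin j of the short FFT equals (1/d) * sum of all X[l] with l ≡ j (mod n/d).
X_alias = X.reshape(d, n // d).sum(axis=0) / d
print(np.allclose(X_folded, X_alias))    # True
print(np.count_nonzero(np.abs(X_alias) > 1e-12), "occupied bins out of", n // d)
```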

    Exact Regeneration Codes for Distributed Storage Repair Using Interference Alignment

    The high repair cost of (n,k) Maximum Distance Separable (MDS) erasure codes has recently motivated a new class of codes, called Regenerating Codes, that optimally trade off storage cost for repair bandwidth. On one end of this spectrum of Regenerating Codes are Minimum Storage Regenerating (MSR) codes, which can match the minimum storage cost of MDS codes while also significantly reducing repair bandwidth. In this paper, we describe Exact-MSR codes, which allow any failed node (whether systematic or parity) to be regenerated exactly rather than only functionally or information-equivalently. We show that Exact-MSR codes come with no loss of optimality with respect to random-network-coding based MSR codes (matching the cutset-based lower bound on repair bandwidth) for the cases of: (a) k/n <= 1/2; and (b) k <= 3. Our constructive approach is based on interference alignment techniques, and, unlike the previous class of random-network-coding based approaches, we provide explicit and deterministic coding schemes that require a finite-field size of at most 2(n-k). Comment: to be submitted to IEEE Transactions on Information Theory.
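
    For a sense of scale, the cutset bound referenced above (in its standard form from the regenerating-codes literature) puts the minimum-storage repair bandwidth at dM/(k(d-k+1)) when a replacement node downloads from d helper nodes, versus downloading the whole file of size M for naive MDS repair. The snippet below just evaluates that formula for a few illustrative (n, k) pairs with d = n-1; the parameter choices are mine, not the paper's, while the 2(n-k) field-size figure is quoted from the abstract.

```python
# Cut-set bound at the minimum-storage regenerating (MSR) point, as stated in
# the regenerating-codes literature: with file size M, per-node storage M/k,
# and repair from d helpers, the repair bandwidth is d*M / (k*(d-k+1)),
# versus M (re-downloading the whole file) for naive MDS repair.
# The (n, k, d) values below are illustrative, not taken from the paper.

def msr_repair_bandwidth(M: float, k: int, d: int) -> float:
    """Minimum repair bandwidth at the MSR point when contacting d helpers."""
    assert d >= k, "need at least k helpers to repair at the MSR point"
    return d * M / (k * (d - k + 1))

M = 1.0  # normalize the file size to 1
for n, k in [(10, 5), (14, 7), (6, 3)]:
    d = n - 1  # contact all surviving nodes (the most favorable case)
    gamma = msr_repair_bandwidth(M, k, d)
    print(f"(n={n}, k={k}, d={d}): MSR repair downloads {gamma:.3f} of the file "
          f"vs 1.000 for naive MDS repair; "
          f"explicit Exact-MSR field-size bound 2(n-k) = {2 * (n - k)}")
```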

    On the Existence of Optimal Exact-Repair MDS Codes for Distributed Storage

    The high repair cost of (n,k) Maximum Distance Separable (MDS) erasure codes has recently motivated a new class of codes, called Regenerating Codes, that optimally trade off storage cost for repair bandwidth. In this paper, we address bandwidth-optimal (n,k,d) Exact-Repair MDS codes, which allow any failed node to be repaired exactly by accessing any d surviving nodes, where k<=d<=n-1. We show the existence of Exact-Repair MDS codes that achieve minimum repair bandwidth (matching the cutset lower bound) for arbitrary admissible (n,k,d), i.e., k<n and k<=d<=n-1. Our approach is based on interference alignment techniques and uses vector linear codes, which allow symbols to be split into arbitrarily small subsymbols. Comment: 20 pages, 6 figures.
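
    For reference, the cutset lower bound these codes are shown to meet can be stated as follows; the file size $\mathcal{M}$ and per-helper download $\beta$ are standard notation from the regenerating-codes literature, introduced here for the statement rather than taken from the abstract.

```latex
% Cut-set bound at the minimum-storage (MDS) point: per-node storage
% \alpha = \mathcal{M}/k, and the per-helper download \beta must satisfy
\beta \;\ge\; \frac{\mathcal{M}}{k\,(d-k+1)},
\qquad
\gamma \;=\; d\,\beta \;\ge\; \frac{d\,\mathcal{M}}{k\,(d-k+1)},
\qquad k \le d \le n-1 .
% "Bandwidth-optimal" exact repair means achieving \gamma with equality
% for every admissible (n, k, d).
```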

    High-resolution distributed sampling of bandlimited fields with low-precision sensors

    The problem of sampling a discrete-time sequence of spatially bandlimited fields with a bounded dynamic range, in a distributed, communication-constrained processing environment is addressed. A central unit, having access to the data gathered by a dense network of fixed-precision sensors operating under stringent inter-node communication constraints, is required to reconstruct the field snapshots to maximum accuracy. Both deterministic and stochastic field models are considered. For stochastic fields, results are established in the almost-sure sense. The feasibility of having a flexible tradeoff between the oversampling rate (sensor density) and the analog-to-digital converter (ADC) precision, while achieving exponential accuracy in the number of bits per Nyquist-interval per snapshot, is demonstrated. This exposes an underlying "conservation of bits" principle: the bit budget per Nyquist-interval per snapshot (the rate) can be distributed along the amplitude axis (sensor precision) and space (sensor density) in an almost arbitrary discrete-valued manner, while retaining the same (exponential) distortion-rate characteristics. Achievable information scaling laws for field reconstruction over a bounded region are also derived: with $N$ one-bit sensors per Nyquist-interval, $\Theta(\log N)$ Nyquist-intervals, and total network bitrate $R_{net} = \Theta((\log N)^2)$ (per-sensor bitrate $\Theta((\log N)/N)$), the maximum pointwise distortion goes to zero as $D = O((\log N)^2/N)$ or $D = O(R_{net}\, 2^{-\beta \sqrt{R_{net}}})$. This is shown to be possible with only nearest-neighbor communication, distributed coding, and appropriate interpolation algorithms. For a fixed, nonzero target distortion, the number of fixed-precision sensors and the network rate needed is always finite. Comment: 17 pages, 6 figures; paper withdrawn from IEEE Transactions on Signal Processing and re-submitted to the IEEE Transactions on Information Theory.
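
    A quick consistency check (with logarithms taken base 2 and constants absorbed into $\beta$) shows that the two distortion expressions quoted above agree under the stated scaling $R_{net} = \Theta((\log N)^2)$; the constant $c$ below is introduced only for this calculation.

```latex
% Writing R_{net} = c (\log_2 N)^2, so that \sqrt{R_{net}} = \sqrt{c}\,\log_2 N:
R_{net}\, 2^{-\beta\sqrt{R_{net}}}
  \;=\; c\,(\log_2 N)^2 \, 2^{-\beta\sqrt{c}\,\log_2 N}
  \;=\; c\,(\log_2 N)^2 \, N^{-\beta\sqrt{c}}
  \;=\; \Theta\!\left(\frac{(\log N)^2}{N}\right)
  \quad \text{when } \beta\sqrt{c} = 1 .
```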

    Efficient File Synchronization: a Distributed Source Coding Approach

    The problem of reconstructing a source sequence in the presence of decoder side-information that is mis-synchronized to the source due to deletions is studied in a distributed source coding framework. Motivated by practical applications, the deletion process is assumed to be bursty and is modeled by a Markov chain. The minimum rate needed to reconstruct the source sequence with high probability is characterized by an information-theoretic expression, which is interpreted as the amount of information in the deleted content and the locations of the deletions, minus "nature's secret", that is, the uncertainty of the locations given the source and the side-information. For small bursty deletion probability, the asymptotic expansion of the minimum rate is computed. Comment: 9 pages, 2 figures. A shorter version will appear in IEEE International Symposium on Information Theory (ISIT), 201
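
    As a toy illustration of the channel model described above, the sketch below generates the decoder's side-information by running a two-state ("keep"/"delete") Markov chain over the source sequence; the alphabet, transition probabilities, and resulting burst statistics are illustrative assumptions, not values from the paper.

```python
import random

# Toy simulation of the bursty deletion model described in the abstract:
# a two-state Markov chain ("keep" / "delete") walks over the source X, and
# the side-information Y is X with the "delete"-state symbols removed.
# All parameter values here are illustrative, not taken from the paper.

random.seed(1)

def bursty_delete(x, p_enter=0.01, p_exit=0.3):
    """Return (y, deletion_flags) after Markov-modulated bursty deletion.

    p_enter: probability of starting a deletion burst from the "keep" state.
    p_exit : probability of ending the current burst (returning to "keep").
    """
    y, flags, deleting = [], [], False
    for symbol in x:
        if deleting:
            deleting = random.random() >= p_exit   # stay in the burst?
        else:
            deleting = random.random() < p_enter   # start a new burst?
        flags.append(deleting)
        if not deleting:
            y.append(symbol)
    return y, flags

x = [random.randint(0, 1) for _ in range(10_000)]   # i.i.d. binary source
y, flags = bursty_delete(x)
n_del = sum(flags)
bursts = sum(1 for i, f in enumerate(flags) if f and (i == 0 or not flags[i - 1]))
print(f"deleted {n_del} of {len(x)} symbols in {bursts} bursts "
      f"(avg burst length {n_del / max(bursts, 1):.1f})")
```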